[3] Approximation Theory - an overview | ScienceDirect Topics — Publisher Summary. Approximation theory is the branch of mathematics which studies the process of approximating general functions by simple functions such as polynomials, finite elements or Fourier series. It therefore plays a central role in the analysis of numerical methods, in particular approximation of PDE's.
[4] Approximation theory - Encyclopedia of Mathematics — The main contents of approximation theory concerns the approximation of functions. Its foundations are laid by the work of P.L. Chebyshev (1854-1859) on best uniform approximation of functions by polynomials and by K. Weierstrass, who in 1885 established that in principle it is possible to approximate a continuous function on a finite
[5] Lecture outline for Approximation Theory - University of South Carolina — Introduction to Concepts of Approximation Theory. Week 1, Lecture 1 (8/18): Main Issues of Approximation Theory: Choice of basic building components used for approximation, conditioned on the problem being studied, e.g. the physics in solving a PDE, ... Overview description of algebraic and trigonometric polynomials, splines, wavelets.
[6] PDF — Approximation Theory lecture 12: Introduction to Approximation Theory. Interpolation is an invaluable tool in numerical analysis: it provides an easy way to replace a complicated function by a polynomial (or piecewise polynomial), and, at least as importantly, it provides a mechanism for developing numerical algorithms for more sophisticated
[11] Rendering splines on GPU - Computer Graphics Stack Exchange — We have an application which needs to render spline curves (cubic, bezier, b-spline etc.). We currently have working algorithms in C to stroke the control points of these curves into line strips. The issue we are running to is the need to constantly re-stroke the curves based on zoom and how much of the curve is visible.
[12] PDF — The de Boor algorithm allows you to handle non-uniform B-splines, and gives more flexibility in the sampling density, etc. Further, there may be memory access patterns that favor one algorithm over the other, and other issues likely arise if making GPU implementations of the algorithms.
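The de Boor recurrence mentioned in [12] is compact enough to sketch directly. The following is a minimal, illustrative Python implementation of the standard textbook form for evaluating a B-spline at a point; it is not code from the cited PDF:

```python
def de_boor(k, x, t, c, p):
    """Evaluate a degree-p B-spline at x using de Boor's algorithm.

    k: index of the knot span containing x, i.e. t[k] <= x < t[k+1]
    t: knot vector, c: control points, p: degree
    """
    # Copy the p+1 control points that influence the span [t[k], t[k+1])
    d = [c[j + k - p] for j in range(p + 1)]
    # Triangular scheme of repeated convex combinations
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]
```

Because each step is a convex combination, equal control points yield a constant spline (partition of unity), which makes a quick sanity check possible.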
[14] PDF — Although wavelets have their roots in approximation theory and signal processing, they have recently been applied to many problems in computer graphics. These graphics applications include image editing, image compression, and image querying; automatic level-of-detail control for editing and rendering curves and surfaces; surface reconstruction from con
[22] PDF — Using the Newton interpolating polynomial is usually the best choice. It has the advantage that data pairs can be added and interpolated by merely adding one additional term to the previous interpolating polynomial.
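The incremental property described in [22] (a new data pair adds just one term) is easy to see in code. Here is a small, generic Python sketch of Newton's divided differences; it is illustrative, not taken from the cited PDF:

```python
def divided_differences(xs, ys):
    """Newton divided-difference coefficients [f[x0], f[x0,x1], ...]."""
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        # Update in place, highest index first, so lower-order
        # differences are still available when needed.
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(xs, coef, x):
    """Evaluate the Newton-form polynomial by a Horner-like scheme."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result
```

Appending an (n+1)-th data point leaves the first n coefficients unchanged, so extending the interpolant only requires computing one new divided difference.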
[23] PDF — Piecewise polynomials provide alternative to practical and theoretical difficulties with high-degree polynomial interpolation Main advantage of piecewise polynomial interpolation is that large number of data points can be fit with low-degree polynomials
[24] Types Of Interpolation: A Complete Overview — Polynomial interpolation allows for smooth approximation, while nearest-neighbour is a simple method for fast estimation; logarithmic and Lagrange interpolation are suited for specific data types requiring more complexity to estimate. Choosing the appropriate interpolation technique can determine whether your estimations are accurate or flawed.
[25] What are some applications of linear approximation in the real world? — Besides Pedro answer, I would add application in control theory. We use control theory in robotics application as instance. The theory is developed for linear systems, but mechanical modeling is very non-linear, therefore, it is necessary use linear approximations for robotics.
[42] (PDF) The development of approximation theory and some proposed ... — The third stage in the development of approximation theory focused on approximative possibilities of the algebraic and trigonometric polynomials and rational fractions.
[46] The History of Approximation Theory: From Euler to Bernstein — This book aims to tell the historical evolution of the methods and results of approximation theory, starting from the work of Euler in 1777 on minimizing distance errors in maps of Russia and of Laplace in 1843 on finding the best ellipsoid for the earth, and ending with the work of Bernstein.
[49] Bernstein polynomial - Wikipedia — Bernstein polynomials approximating a curve. In the mathematical field of numerical analysis, a Bernstein polynomial is a polynomial expressed as a linear combination of Bernstein basis polynomials. The idea is named after mathematician Sergei Natanovich Bernstein. Polynomials in Bernstein form were first used by Bernstein in a constructive proof for the Weierstrass approximation theorem.
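The linear combination described in [49] can be written out in a few lines. A minimal Python sketch of the degree-n Bernstein approximant on [0, 1] (illustrative only, not from the cited article):

```python
from math import comb  # binomial coefficient, Python 3.8+

def bernstein_approx(f, n, x):
    """Degree-n Bernstein approximation of f on [0, 1]:
    B_n(f)(x) = sum_k f(k/n) * C(n, k) * x^k * (1 - x)^(n - k)."""
    return sum(f(k / n) * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))
```

Bernstein approximants reproduce linear functions exactly, and for f(t) = t² the approximant is exactly x² + x(1 − x)/n, which makes the O(1/n) uniform convergence behind Bernstein's proof of the Weierstrass theorem visible.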
[52] Selected Works II: Probability Theory and Mathematical Statistics ... — The creative work of Andrei N. Kolmogorov is exceptionally wide-ranging. In his studies on trigonometric and orthogonal series, the theory of measure and integral, mathematical logic, approximation theory, geometry, topology, functional analysis, classical mechanics, ergodic theory, superposition of functions, and information theory, he solved many conceptual and fundamental problems and
[53] PDF — According to Grundbegriffe, the mathematical theory of probability studies probability measures, i.e., measures P on a measurable space (Ω, F) such that P(Ω) = 1. An event is just a set E ∈ F and its probability is P(E). On this simple foundation Kolmogorov built a rich mathematical theory which has been developed by many researchers and is
[54] Andrey Kolmogorov: Foundations of AI Through Mathematics — Influence on Vapnik-Chervonenkis (VC) Dimension and Modern Learning Theory. Kolmogorov's foundational work in statistical learning also played a role in the development of Vapnik-Chervonenkis (VC) Theory, introduced by Vladimir Vapnik and Alexey Chervonenkis in the 1970s. The VC Dimension is a measure of the capacity of a hypothesis space
[60] The History of Approximation Theory: From Euler to Bernstein — From the reviews: "This book aims to tell the historical evolution of the methods and results of approximation theory, starting from the work of Euler in 1777 on minimizing distance errors in maps of Russia, and of Laplace in 1843 on finding the best ellipsoid for the earth, and ending with the work of Bernstein.
[61] The history of approximation theory: From euler to bernstein — The problem of approximating a given quantity is one of the oldest challenges faced by mathematicians. Its increasing importance in contemporary mathematics has created an entirely new area known as Approximation Theory. The modern theory was initially developed along two divergent schools of thought: the Eastern or Russian group, employing almost exclusively algebraic methods, was headed by
[65] Understanding the Kolmogorov-Arnold Network - Medium — While modern neural networks have moved beyond the direct use of the Kolmogorov-Arnold representation, the theorem still provides foundational insights into the function approximation capabilities
[66] Deep Kusuoka Approximation: High-Order Spatial Approximation for ... — The paper introduces a new deep learning-based high-order spatial approximation for a solution of a high-dimensional Kolmogorov equation where the initial condition is only assumed to be a continuous function and the condition on the vector fields associated with the differential operator is very general, i.e. weaker than Hörmander's hypoelliptic condition. In particular, the deep learning
[67] [2411.06078] A Survey on Kolmogorov-Arnold Network - arXiv.org — This systematic review explores the theoretical foundations, evolution, applications, and future potential of Kolmogorov-Arnold Networks (KAN), a neural network model inspired by the Kolmogorov-Arnold representation theorem. KANs distinguish themselves from traditional neural networks by using learnable, spline-parameterized functions instead of fixed activation functions, allowing for
[68] Bernstein Functions - De Gruyter — Bernstein functions appear in various fields of mathematics, e.g. probability theory, potential theory, operator theory, functional analysis and complex analysis - often with different definitions and under different names. Among the synonyms are `Laplace exponent' instead of Bernstein function, and complete Bernstein functions are sometimes called `Pick functions', `Nevanlinna functions' or
[77] Approximation Theory and Beyond (In honor of Larry Schumaker's 80th ... — Over time, Approximation Theory has increasingly become interdisciplinary. This topical collection is centered on recent advancements in approximation theory and its practical applications. It originated from the conference, "Approximation Theory and Beyond," held from May 15 to 18, 2023, at Vanderbilt University in Nashville, Tennessee.
[78] Systems and Soft Computing | ScienceDirect.com by Elsevier - Systems ... — The topics cover the recent advances in the fields of approximation theory and special functions. We will focus on the following: Approximation Theory: Classical approximation, statistical approximation, fuzzy approximation, approximation in the complex plane, best approximation, interpolation,
[80] Recent Advances in Constructive Approximation Theory — This book presents an in-depth study on advances in constructive approximation theory with recent problems on linear positive operators. State-of-the-art research in constructive approximation is treated with extensions to approximation results on linear positive operators in a post quantum and bivariate setting.
[83] Approximations of practice as a framework for understanding ... — This work is grounded in a theory of practice-based teacher education, in which teachers learn from engaging in and reflecting on their practice in several ways, including engagement in approximations of practice (Grossman et al., 2009). Approximations "include opportunities to rehearse and enact discrete components of complex practice in settings of reduced complexity" (p. 283). The
[84] Real-world Applications: Bridging the Gap Between Theory and Practice ... — Through hands-on activities designed to stimulate introspection, critical thinking, and the application of theoretical concepts, experiential learning is a dynamic method of teaching. Experiential learning, which was first introduced by educational theorists such as David Kolb and John Dewey, highlights the iterative process of learning via experience, introspection, conceptualization, and
[85] Approximation Theory - an overview | ScienceDirect Topics — Approximation theory is the branch of mathematics which studies the process of approximating general functions by simple functions such as polynomials, finite elements or Fourier series. It therefore plays a central role in the analysis of numerical methods, in particular approximation of PDE's.
[91] Approximation Theory and Related Applications | MDPI Books — In recent years, we have seen a growing interest in various aspects of approximation theory. This happened due to the increasing complexity of mathematical models that require computer calculations and the development of the theoretical foundations of the approximation theory. Approximation theory has broad and important applications in many areas of mathematics, including functional analysis
[92] Approximation Theory and Applications - ScienceDirect — Approximation Theory and Applications: Piecewise Linear and Generalized Functions presents the main provisions of approximation theory, and considers existing and new methods for approximating piecewise linear and generalized functions, widely used to solve problems related to mathematical modeling of systems, processes, and phenomena in fields ranging from engineering to economics.
[93] Approximation theory - Department of Mathematics - The University of ... — Approximation theory is a key component of contemporary algorithms used in computational science and engineering. The mathematical theory underlying approximation has a crucial role in the design of efficient computational algorithms for representing real-life data and the numerical solution of differential equations, by providing insight into
[94] Approximation Theory and Beyond (In honor of Larry Schumaker's 80th ... — Over time, Approximation Theory has increasingly become interdisciplinary. This topical collection is centered on recent advancements in approximation theory and its practical applications. It originated from the conference, "Approximation Theory and Beyond," held from May 15 to 18, 2023, at Vanderbilt University in Nashville, Tennessee.
[95] Recent Developments in Spectral and Approximation Theory — This book is a collection of recent developments in spectral and approximation theory. The results collected here were presented at the International Conference on Spectral and Approximation Theory (ICSAT-2023) which took place at Cochin University of Science and Technology in Kerala, India. The conference ICSAT-2023 focuses on two significant
[122] Approximation theory - Wikipedia — In mathematics, approximation theory is concerned with how functions can best be approximated with simpler functions, and with quantitatively characterizing the errors introduced thereby. What is meant by best and simpler will depend on the application. A closely related topic is the approximation of functions by generalized Fourier series, that is, approximations based upon summation of a
[125] What is: Approximation Theory Explained in Detail — Over the years, the theory has evolved significantly, incorporating various mathematical tools and concepts, including splines, wavelets, and Fourier series, which have broadened its applicability across different scientific domains. Key Concepts in Approximation Theory
[128] Approximation theory and its applications in engineering - Complete Phd ... — 5.4 Practical Implications. Brief Overview: Approximation theory is a branch of mathematics that deals with the approximation of complex functions by simpler functions. It has many applications in engineering, where it is used to simplify complex models and make them more computationally efficient.
[129] Approximation Theory and Applications | ScienceDirect — Approximation Theory and Applications: Piecewise Linear and Generalized Functions presents the main provisions of approximation theory, and considers existing and new methods for approximating piecewise linear and generalized functions, widely used to solve problems related to mathematical modeling of systems, processes, and phenomena in fields ranging from engineering to economics.
[131] P. L. Chebyshev (1821-1894) and his contacts with ... - ScienceDirect — Weierstrass who, together with Chebyshev, became the second co-founder of approximation theory (with the theorem of 1885 named after him), already criticized Chebyshev's methods of 1857 in that same year; he preferred to solve the problem using Jacobi's theory of elliptic functions, which gave a "clearer and deeper insight into the essence of
[133] The History of Approximation Theory - Springer — The final chapter emphasizes the important work of the Russian Jewish mathematician Sergei Bernstein, whose constructive proof of the Weierstrass theorem and extension of Chebyshev's work serve to unify East and West in their approaches to approximation theory.
[138] Definitions, methods, and applications in interpretable machine learning — In the context of ML, there are 2 areas where errors can arise: when approximating the underlying data relationships with a model (predictive accuracy) and when approximating what the model has learned using an interpretation method (descriptive accuracy). Post hoc interpretability (Section 6) involves using methods to extract information from a trained model (with no effect on predictive accuracy). Different model-based interpretability methods provide different ways of increasing descriptive accuracy by constructing models which are easier to understand, sometimes resulting in lower predictive accuracy. Thus, an effective way of increasing the potential uses for model-based interpretability is to devise new modeling methods which produce higher predictive accuracy while maintaining their high descriptive accuracy and relevance.
[140] On the Exact Evaluation of Integrals of Wavelets - MDPI — Wavelet expansions are a powerful tool for constructing adaptive approximations. For this reason, they find applications in a variety of fields, from signal processing to approximation theory. Wavelets are usually derived from refinable functions, which are the solution of a recursive functional equation called the refinement equation.
[141] Spline approximation - (Intro to Engineering) - Fiveable — Spline approximation is a numerical method used to construct a piecewise polynomial function, known as a spline, that can closely approximate a given set of data points. This technique is particularly useful in interpolation and smoothing of data, allowing for more flexible and accurate modeling compared to traditional polynomial fitting.
[142] Spline approximation - Encyclopedia of Mathematics — Methods of spline approximation are closely connected with the numerical solution of partial differential equations by the finite-element method, which is based on the Ritz method with a special choice of basis functions. In this method, one chooses piecewise-polynomial functions (i.e. splines, cf. Spline) as basis functions.
[143] Spline interpolation - Wikipedia — In the mathematical field of numerical analysis, spline interpolation is a form of interpolation where the interpolant is a special type of piecewise polynomial called a spline. That is, instead of fitting a single, high-degree polynomial to all of the values at once, spline interpolation fits low-degree polynomials to small subsets of the values, for example, fitting nine cubic polynomials
[145] PDF — Spline Interpolation. We've approached the interpolation problem by choosing (high-degree) polynomials for our basis functions φ_j: f(x) = ∑_{j=0}^{n} c_j φ_j(x). This approach can be efficient (recall the barycentric form of the Lagrange interpolant), but using high-degree polynomials can lead to large errors due to erratic oscillations, especially near the interval endpoints.
[151] Performance improvement of machine learning models via wavelet theory ... — Wavelets are also able to separate fine details within a signal in a manner similar to an enhanced Fourier transform (Sifuzzaman et al.). ... Pre-processing with EMD or EEMD will improve the performance of machine learning models for estimation of river streamflow (Huang et al., 2014; Liu et al.
[156] 2 - Polynomials and spline functions as approximating tools — They have definite advantages, in comparison with polynomials, for computer realizations and, moreover, it turns out that they are the best approximation tool in many important cases. In this chapter we give the general properties of polynomials and polynomial splines that it will be necessary to know in the rest of the book.
[157] Spline approximation - Encyclopedia of Mathematics — These methods were studied before problems of best approximation by splines, and attention has centred on approximation by interpolation splines (cf. Interpolation spline). These often give the same order of approximation as splines of best approximation, which is one of the advantages over interpolation by polynomials.
[158] When is cubic spline interpolation better than an interpolating polynomial? — The following plot is a slight variation of an example in a text book. The author used this example to illustrate that an interpolating polynomial over equally spaced samples has large oscillations near the ends of the interpolating interval. Of course cubic spline interpolation gives a good approximation over the whole interval.
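The end-of-interval oscillations described in [158] are easy to reproduce with the classic Runge example. A small, self-contained Python sketch (equispaced nodes, naive Lagrange evaluation; illustrative only, not the textbook's code):

```python
def runge(x):
    """Runge's example f(x) = 1 / (1 + 25 x^2) on [-1, 1]."""
    return 1.0 / (1.0 + 25.0 * x * x)

def lagrange_eval(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def max_error(n, m=401):
    """Max |f - p_n| on a fine grid, with n+1 equispaced nodes on [-1, 1]."""
    xs = [-1.0 + 2.0 * k / n for k in range(n + 1)]
    ys = [runge(x) for x in xs]
    grid = [-1.0 + 2.0 * k / (m - 1) for k in range(m)]
    return max(abs(runge(x) - lagrange_eval(xs, ys, x)) for x in grid)
```

Raising the degree makes the equispaced-node error worse, not better, which is precisely the motivation for switching to piecewise (spline) interpolation.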
[159] What are the advantages / disadvantages of using splines, smoothed ... — In a nutshell: use splines if your curve has to pass through the given data (with the implicit assumption that your data is "exact"), and use smoothing splines if you have "noisy" data and you want to get a feel for how the data varies. FWIW, my opinion is that smoothing data for purposes of display is cheating; show the noisy data, and figure out why your data is noisy to begin
[164] Approximation Methods: Theory and Applications: Journal of Function Spaces — Approximation theory is one of the most active research areas because of its crucial applications in many branches of science. The theory has a role in both mathematical sciences (e.g. constructive approximation of functions, solutions of partial and integral equations, etc) and engineering sciences (e.g. computer-aided geometric design, image processing, etc).
[165] Approximation Theory, Wavelets and Applications | SpringerLink — Approximation Theory, Wavelets and Applications draws together the latest developments in the subject, provides directions for future research, and paves the way for collaborative research. The main topics covered include constructive multivariate approximation, theory of splines, spline wavelets, polynomial and trigonometric wavelets, interpolation theory, polynomial and rational approximation.
[166] Approximation Theory and Related Applications - James Madison University — In recent years, we have seen a growing interest in various aspects of approximation theory. This happened due to the increasing complexity of mathematical models that require computer calculations and the development of the theoretical foundations of the approximation theory. Approximation theory has broad and important applications in many areas of mathematics, including functional analysis
[167] Approximation Theory | Algor Cards - Algor Education — Approximation theory includes a repertoire of formulas and methods that are essential for applying mathematical principles to practical problems. The Taylor Series, for example, allows for the approximation of functions using an infinite sum of terms calculated from the function's derivatives at a single point.
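The Taylor-series method named in [167] is the simplest example in this repertoire. A minimal Python sketch of the degree-n Taylor polynomial of e^x about 0 (illustrative only):

```python
from math import exp, factorial

def taylor_exp(x, n):
    """Degree-n Taylor polynomial of e^x about 0: sum of x^k / k!."""
    return sum(x ** k / factorial(k) for k in range(n + 1))
```

For e^x the remainder after degree n is bounded by e^|x| · |x|^(n+1)/(n+1)!, so even n = 10 already gives roughly 7-digit accuracy at x = 1.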
[168] Approximation Methods in Science and Engineering — Approximation Methods in Engineering and Science covers fundamental and advanced topics in three areas: Dimensional Analysis, Continued Fractions, and Stability Analysis of the Mathieu Differential Equation. Throughout the book, a strong emphasis is given to concepts and methods used in everyday calculations.
[169] PDF — polynomials of two variables instead of orthogonal polynomials, in considering rounded and cutting blocks in 3D polynomial approximation. Key words: Local Polynomial Approximation, image processing, image filtration, gradient, image resize, video filtration, image convolution, convolution mask, noise filtration, facet model 1. Introduction.
[176] (PDF) The development of approximation theory and some proposed ... — Then, we provide some open problems of approximation theory in mechanical engineering and ophthalmology, through examples in gear transmission and medical optics.
[184] Approximation Theory in Machine Learning - Restackio — Approximation theory plays a crucial role in the development and understanding of machine learning models. It provides the mathematical foundation for how well a model can approximate a function based on a finite set of data points.
[185] A Gentle Introduction To Approximation - MKAI — When it comes to machine learning tasks such as classification or regression, approximation techniques play a key role in learning from the data. Many machine learning methods approximate a function or a mapping between the inputs and outputs via a learning algorithm.
[186] Understanding the Universal Approximation Theorem | by Machine Learning ... — Here, we explore the UAT's fundamental role in neural network theory, emphasizing its guarantee that a suitably designed network can approximate any function, given enough neurons and the right
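The guarantee sketched in [186] can be made concrete in one dimension: any continuous piecewise-linear interpolant is exactly a one-hidden-layer ReLU network, so adding hidden units (knots) drives the approximation error of a continuous function to zero. A minimal, illustrative Python construction (my sketch, not from the cited article):

```python
from math import pi, sin

def relu(z):
    return max(0.0, z)

def relu_net(x, knots, values):
    """One-hidden-layer ReLU network realizing the piecewise-linear
    interpolant of (knots, values): each hidden unit relu(x - t)
    switches on a slope change at knot t."""
    slopes = [(values[i + 1] - values[i]) / (knots[i + 1] - knots[i])
              for i in range(len(knots) - 1)]
    out = values[0]          # output bias
    prev_slope = 0.0
    for t, s in zip(knots[:-1], slopes):
        out += (s - prev_slope) * relu(x - t)  # weight = slope change
        prev_slope = s
    return out
```

With 50 knots on [0, π], the network matches sin to within about 5·10⁻⁴, and the standard piecewise-linear error bound h²·max|f″|/8 predicts exactly that rate.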
[187] Optimization of Neural Network Training for Image Recognition Based on ... — The use of polynomials for the approximation of multidimensional functions opens up new opportunities for studying ANNs. For example, orthogonal transformations have applications for image and speech signal processing, feature selection in image recognition, generalized Wiener filtering, and spectroscopy. An ANN can be considered not only as
[188] Universal Approximation Theory: The Basic Theory for Deep Learning ... — range of applications, encompassing tasks such as image segmentation (Minaee et al. 2021; Wang et al. 2022a), classification (Rawat and Wang 2017; Wang et al. 2017), video synthesis (Wang et al. 2018, 2019a), and restoration (Wang et al. 2019b; Nah et al. 2019). This broad applicability highlights its enormous technical potential and
[198] Mathematical and Computational Methods for Modelling, Approximation and ... — This book contains plenary lectures given at the International Conference on Mathematical and Computational Modeling, Approximation and Simulation, dealing with three very different problems: reduction of Runge and Gibbs phenomena, difficulties arising when studying models that depend on the highly nonlinear behaviour of a system of PDEs, and data fitting with truncated hierarchical B-splines
[238] Mathematical Analysis, Approximation Theory and Their Applications — Recent and significant developments in approximation theory, special functions and q-calculus along with their applications to mathematics, engineering, and social sciences are discussed and analyzed. Each chapter enriches the understanding of current research problems and theories in pure and applied research.
[239] Approximation Theory - an overview | ScienceDirect Topics — Approximation theory is the branch of mathematics which studies the process of approximating general functions by simple functions such as polynomials, finite elements or Fourier series. ... This area of mathematics is extremely important and includes the subjects of least-squares methods, minimal projections, orthogonal polynomials, optimal
[240] What is: Approximation Theory Explained in Detail — Approximation Theory is a branch of mathematical analysis that focuses on how functions can be approximated with simpler, more manageable functions. This field is essential in various applications, including numerical analysis, computer science, and engineering, where exact solutions are often impractical or impossible to obtain.
[242] What is: Approximation Theory Explained in Detail — Challenges in Approximation Theory. Despite its usefulness, Approximation Theory faces several challenges, particularly when dealing with high-dimensional spaces or non-linear functions. The curse of dimensionality can make it increasingly difficult to find effective approximations as the number of dimensions increases.
[243] Approximation Theory and Applications | ScienceDirect — However, challenges often arise when constructing solutions over the entire domain of these functions, requiring the use of special mathematical methods to put theory into practice. ... Approximation Theory and Applications: Piecewise Linear and Generalized Functions presents the main provisions of approximation theory, and considers existing and
[244] Approximation Theory and Approximation Practice - University of Oxford — There is a bias toward theorems and methods for analytic functions, which appear so often in practical approximation, rather than on functions at the edge of discontinuity with their seductive theoretical challenges. Original sources are cited rather than textbooks, and each item in the 27-page bibliography is annotated with an editorial comment.
[246] Approximation Theory, Wavelets and Applications | SpringerLink — Approximation Theory, Wavelets and Applications draws together the latest developments in the subject, provides directions for future research, and paves the way for collaborative research. The main topics covered include constructive multivariate approximation, theory of splines, spline wavelets, polynomial and trigonometric wavelets, interpolation theory, polynomial and rational approximation.
[248] Overcoming the Curse of Dimensionality in Reinforcement Learning ... — Reinforcement Learning (RL) algorithms are known to suffer from the curse of dimensionality, which refers to the fact that large-scale problems often lead to exponentially high sample complexity. A common solution is to use deep neural networks for function approximation; however, such approaches typically lack theoretical guarantees. To provably address the curse of dimensionality, we observe
[249] PDF — Another area where the curse of dimensionality has been an essential obstacle is machine learning and data analysis, where the complexity of nonlinear regression models, for example, goes up exponentially with the dimensionality. In both cases the essential problem we face is how to represent or approximate a nonlinear function in high dimensions.
[252] Major advancements in kernel function approximation — Kernel based methods have become popular in a wide variety of machine learning tasks. They rely on the computation of kernel functions, which implicitly transform the data in its input space to data in a very high dimensional space. Efficient application of these functions has been the subject of study in the last 10 years. The main focus was on improving the scalability of kernel based methods
[254] Reinforcement learning algorithms with function approximation: Recent ... — In recent years, the research on reinforcement learning (RL) has focused on function approximation in learning prediction and control of Markov decision processes (MDPs). Learning control algorithms with function approximation are surveyed, with the main focus on highly efficient RL algorithms such as fitted-Q iteration, approximate policy iteration, and adaptive critic designs (ACDs). Until recently, the three main categories of approximate RL methods for learning control have been value function approximation (VFA), policy search, and actor–critic methods. As an interdisciplinary area, RL and ADP algorithms with function approximation have attracted research interest from domains such as machine learning, control theory, operations research, and robotics.
[255] Statistical challenges of high-dimensional data - PMC — The practical and theoretical challenges posed by the large p /small n settings, along with the ferment of recent research, formed the backdrop to the 2008 research programme 'Statistical Theory and Methods for Complex, High-dimensional Data' at the Isaac Newton Institute for Mathematical Sciences, which stimulated this Theme Issue.
[257] Approximation Theory and Approximation Practice - University of Oxford — There is a bias toward theorems and methods for analytic functions, which appear so often in practical approximation, rather than on functions at the edge of discontinuity with their seductive theoretical challenges. Original sources are cited rather than textbooks, and each item in the 27-page bibliography is annotated with an editorial comment.
[258] PDF — Interpolation by polynomials and rational functions of discontinuous functions is an historical approach and well-studied. Two well-known phenomena are the Runge and Gibbs effects [Runge 1901, Gibbs 1899]. Interpolation by kernels, mainly Radial Basis Functions, are suitable for high-dimensional scattered data problems [Hardy 1971, MJD
[260] Multivariate neural network operators with sigmoidal activation ... — Constructive multivariate approximation algorithms based on sigmoidal functions are important since they play a central role in typical applications of neurocomputing processes concerning high-dimensional data. Applications of NNs with sigmoidal functions in Numerical Analysis, for instance, to the numerical solution of Volterra integral and integro-differential equations by suitable
[261] Multivariate numerical approximation using constructive — Various problems concerning the applications in many different disciplines such as computer science, engineering, and physics can be converted into the problems of approximating multivariate functions, increasingly involving hypersurface fitting of four-dimensional and higher-dimensional data, like the
[269] [1912.04310] Efficient approximation of high-dimensional functions with ... — In this paper, we develop a framework for showing that neural networks can overcome the curse of dimensionality in different high-dimensional approximation problems. Our approach is based on the notion of a catalog network, which is a generalization of a standard neural network in which the nonlinear activation functions can vary from layer to layer as long as they are chosen from a predefined
[276] A Review of Recent Advances in Surrogate Models for Uncertainty Quantification of High-Dimensional Engineering Applications - ScienceDirect — Challenges in surrogate modeling for high-dimensional spaces are comprehended. High-dimensional benchmark functions assessing the surrogate models are provided. Nonetheless, as the complexity of the problem increases and the number of input variables grows, the computational burden of constructing an efficient surrogate model also rises, leading to the so-called curse of dimensionality in uncertainty propagation from inputs to outputs. This paper reviews the developments of the past years in surrogate modeling for high-dimensional inputs, with the goal of quantifying output uncertainty.
[277] Regression and Classification With Spline-Based Separable Expansions — Approximating multivariate functions in high dimensions quickly becomes infeasible due to the curse of dimensionality. ... B-splines satisfy, in comparison to approximation by pure polynomials, some favorable properties as they are compactly supported—B-splines are nonzero only on a small interval—and allow for more adaptive local